66 research outputs found

    Towards Interference Aware IoT Framework: Energy and Geo-location based Modelling

    Get PDF
    In multi-hop wireless communication, a sensor node must consume its energy efficiently when relaying data packets. However, most IoT devices have limited battery power and computing resources for wireless communication, so energy optimization becomes one of the major concerns in wireless sensor routing design. These wireless technologies usually transmit data in the unlicensed 2.4 GHz frequency band. Because the medium is broadcast, a wireless transmission interferes with reception at surrounding radios; as a result, transmission failures increase and communication quality drops. One effective solution is therefore to select, at each hop, a node that has few neighbouring nodes to disseminate a packet until it reaches the final receiver. The proposed routing selects nodes with few neighbours and thus less interference; in other words, the scheme achieves better load balancing and reduces the probability of overloading any single sensor node. It also introduces a new clustering algorithm around a single base station that shortens transmission distances, periodically selecting cluster heads (CHs) according to their distance from the final destination. Extensive simulation studies show that the proposed algorithm finds suitable routes and cluster formations for forwarding traffic and thereby minimizes the interference ratio. In addition, the proposed protocol achieves lower energy consumption and a longer network lifetime than other popular protocols.
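
    A minimal sketch (not the authors' implementation) of the two selection rules described above: choosing a next hop with the fewest neighbours as a proxy for low interference, and electing the cluster head closest to the base station. The node representation, radio range, and helper names are illustrative assumptions.

    import math

    def neighbours(node, nodes, radio_range):
        """Nodes within radio range of `node` (excluding itself)."""
        return [n for n in nodes
                if n is not node and math.dist(node["pos"], n["pos"]) <= radio_range]

    def next_hop(current, candidates, nodes, radio_range):
        """Among reachable candidates, prefer the one with the fewest neighbours."""
        reachable = [c for c in candidates
                     if math.dist(current["pos"], c["pos"]) <= radio_range]
        return min(reachable,
                   key=lambda c: len(neighbours(c, nodes, radio_range)),
                   default=None)

    def elect_cluster_head(cluster, base_station_pos):
        """Pick the cluster member closest to the base station as CH."""
        return min(cluster, key=lambda n: math.dist(n["pos"], base_station_pos))

    # Toy usage with six synthetic node positions.
    nodes = [{"id": i, "pos": (i * 10.0, (i % 3) * 15.0)} for i in range(6)]
    hop = next_hop(nodes[0], nodes[1:], nodes, radio_range=25.0)
    ch = elect_cluster_head(nodes, base_station_pos=(0.0, 0.0))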

    Review of the state of the art of deep learning for plant diseases: a broad analysis and discussion

    Get PDF
    Deep learning (DL) represents the golden era of the machine learning (ML) domain and has gradually become the leading approach in many fields. It currently plays a vital role in the early detection and classification of plant diseases. ML techniques are viewed as having brought considerable improvements to cultivation productivity, particularly with the recent emergence of DL, which appears to have raised accuracy levels. Recently, many DL architectures have been implemented together with visualisation techniques that are essential for identifying symptoms and classifying plant diseases. This review investigates and analyses the most recent methods, developed over the three years leading up to 2020, for training, augmentation, feature fusion and extraction, recognising and counting crops, and detecting plant diseases, including how these methods can be harnessed to feed deep classifiers and their effects on classifier accuracy.

    Physics-informed radial basis network (PIRBN): A local approximation neural network for solving nonlinear PDEs

    Full text link
    Our recent intensive study has found that physics-informed neural networks (PINNs) tend to be local approximators after training. This observation leads to the novel physics-informed radial basis network (PIRBN), which maintains the local property throughout the entire training process. Compared to deep neural networks, a PIRBN comprises only one hidden layer and a radial basis "activation" function. Under appropriate conditions, we demonstrate that training PIRBNs with gradient descent methods can converge to Gaussian processes. We also study the training dynamics of PIRBN via neural tangent kernel (NTK) theory and conduct comprehensive investigations of initialisation strategies for PIRBN. Numerical examples demonstrate that PIRBN is more effective and efficient than PINN in solving PDEs with high-frequency features and ill-posed computational domains. Moreover, existing PINN numerical techniques, such as adaptive learning, decomposition and different types of loss functions, are applicable to PIRBN. Programs that reproduce all numerical results are available at https://github.com/JinshuaiBai/PIRBN. (48 pages, 26 figures)
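
    A minimal sketch of the single-hidden-layer radial basis architecture described above: a Gaussian radial basis "activation" followed by a linear output layer. This is not the authors' PIRBN code; the centres, widths, and sizes are illustrative assumptions.

    import numpy as np

    def pirbn_forward(x, centres, widths, weights):
        """x: (N, 1) inputs; centres, widths, weights: (H,) per-neuron parameters."""
        # Gaussian radial basis activation: phi_j(x) = exp(-b_j^2 * (x - c_j)^2)
        phi = np.exp(-(widths[None, :] ** 2) * (x - centres[None, :]) ** 2)
        return phi @ weights  # linear output layer

    # Toy usage on the unit interval with 50 assumed basis functions.
    x = np.linspace(0.0, 1.0, 101)[:, None]
    H = 50
    centres = np.linspace(0.0, 1.0, H)
    widths = np.full(H, 10.0)
    weights = np.random.default_rng(0).normal(size=H)
    u = pirbn_forward(x, centres, widths, weights)  # network approximation of u(x)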

    Bioinformatics: Computational Approaches for Genomics and Proteomics

    Get PDF
    Bioinformatics is a fast-evolving field that combines biology, computer science, and statistics to analyze and comprehend enormous volumes of biological data. With the introduction of high-throughput technologies such as next-generation sequencing and mass spectrometry, genomics and proteomics research has generated vast amounts of data, necessitating the development of computational tools to process these datasets and extract useful insights from them. This work presents a survey of computational approaches in bioinformatics, with a particular emphasis on their application to genomics and proteomics. Genomics covers the study of entire genomes, including genome annotation, assembly, and comparative genomics. Proteomics focuses on the investigation of proteins, including their identification, quantification, structural analysis, and functional characterization. Consequently, the importance of the field of bioinformatics has continued to increase.

    Automated masks generation for coffee and apple leaf infected with single or multiple diseases-based color analysis approaches

    Get PDF
    Identification of plant disease is affected by many factors: rare or mild symptoms are scarce in datasets; segmentation is sensitive to the lighting and shadow conditions under which images are captured; and symptoms may appear as multiple lesions of varied colours on the same leaf at different stages of infection. Traditional approaches face several problems: contrast handling leaves mild symptoms undetected, and edge handling causes curved surfaces and veins to be treated as new regions of interest. Threshold-based segmentation is restricted to a specific range of values, which prevents it from dealing with an entire area (healthy, injured, or noise). Deep learning approaches also struggle with imbalanced datasets. Overlapping symptoms on the same leaf sample are rare, and most deep models detect only a single type of lesion at a time; training such models with masks containing a single type of infection leads to misclassification, while manual annotation of symptoms is time-consuming. The framework proposed in this study therefore attempts to overcome these drawbacks of traditional segmentation approaches in order to generate masks for deep disease-classification models. The main objective is to label datasets based on a semi-automated segmentation of leaves and disordered regions. There is no need to adjust contrast or apply filters, so lesion characteristics remain unchanged and every pixel in the predetermined lesions is selected accurately. The approach is applied to three different datasets with single and multiple infections. The overall precision obtained is 90%, and the average intersection over union of the injured regions is 0.83. Brown and dark brown lesions are segmented more accurately than yellow lesions.
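
    A minimal sketch, not the authors' pipeline, of colour-range mask generation and intersection-over-union scoring in the spirit of the approach above. The HSV range used for brown lesions is an illustrative assumption that would need tuning per dataset; OpenCV and NumPy are assumed to be available.

    import cv2
    import numpy as np

    def lesion_mask(bgr_image, hsv_low=(5, 60, 20), hsv_high=(25, 255, 200)):
        """Binary mask of pixels whose HSV values fall in an assumed brown range."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        return cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))

    def iou(pred_mask, gt_mask):
        """Intersection over union of two binary masks."""
        pred, gt = pred_mask > 0, gt_mask > 0
        union = np.logical_or(pred, gt).sum()
        return np.logical_and(pred, gt).sum() / union if union else 1.0

    # Usage (hypothetical file names): iou(lesion_mask(cv2.imread("leaf.jpg")), ground_truth_mask)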

    Quantum Computing: Algorithms, Architectures, and Applications

    Get PDF
    Cryptography, optimization, simulation, and machine learning are just a few of the industries that might be completely transformed by quantum computing. This abstract gives a thorough introduction to quantum computing, with an emphasis on its algorithms, architectures, and applications. It highlights the revolutionary potential of quantum computing in tackling difficult problems that are beyond the reach of conventional computers, laying the groundwork for further research into and understanding of this rapidly developing field.

    Parallel and Distributed Computing for High-Performance Applications

    Get PDF
    The study of parallel and distributed computing has become an important area of computer science because it makes it possible to create high-performance software that can effectively handle challenging computational tasks. This study gives a thorough introduction to parallel and distributed computing techniques and their use in high-performance applications. The core idea underpinning parallel and distributed computing is the partitioning of a computation into smaller subtasks that can be executed concurrently on multiple processors or machines; this strategy enables faster execution times and improved overall performance. Parallel and distributed computing are essential for high-performance applications such as scientific simulations, data analysis, and artificial intelligence, which frequently demand significant computational resources. This article offers a thorough review of the theories, methods, challenges, and developments in parallel and distributed computing for high-performance applications. By understanding the underlying concepts and exploiting the most recent advances, researchers and practitioners can fully utilize the potential of parallel and distributed computing to open up new vistas in computational science and engineering.
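
    A minimal sketch of the partition-into-subtasks idea described above, using Python's standard multiprocessing pool. The workload (a sum of squares) and the chunking scheme are illustrative assumptions, not material from the study.

    from multiprocessing import Pool

    def subtask(chunk):
        """A stand-in compute kernel: sum of squares of one data chunk."""
        return sum(x * x for x in chunk)

    def parallel_sum_of_squares(data, workers=4):
        # Partition the work into roughly equal chunks, one per worker.
        chunks = [data[i::workers] for i in range(workers)]
        with Pool(processes=workers) as pool:
            partials = pool.map(subtask, chunks)  # subtasks run concurrently
        return sum(partials)                      # combine partial results

    if __name__ == "__main__":
        print(parallel_sum_of_squares(list(range(1_000_000))))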

    Computational Intelligence for Solving Complex Optimization Problems

    Get PDF
    Computational intelligence (CI) has proven to be a powerful and diverse discipline for solving complex optimization problems. Because real-world problems are becoming increasingly complicated, traditional optimization approaches frequently struggle to offer efficient and effective solutions. Evolutionary algorithms, neural networks, fuzzy systems, and swarm intelligence are just a few of the many methods, inspired by both natural and artificial intelligence, that fall under the umbrella of computational intelligence. This abstract examines how computational intelligence techniques are used to solve complex optimization problems, highlighting their benefits, drawbacks, and most recent developments. CI techniques provide a potent and adaptable means of addressing challenging optimization problems; they are well suited to handling the non-linear relationships, uncertainties, and multi-objective settings that arise in real-world problems. Recent developments in hybrid techniques and metaheuristics have pushed the limits of computational intelligence, even though obstacles in algorithm design and parameter tuning remain. As technology continues to advance, computational intelligence is expected to play an increasingly significant role in tackling complex optimization problems and fostering innovation across a variety of disciplines.
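
    A minimal sketch of one computational-intelligence technique named above: a simple evolutionary algorithm minimising a toy objective. The objective function, population size, and mutation scale are illustrative assumptions, not part of the surveyed work.

    import random

    def objective(x):
        """Toy non-linear objective to minimise (sphere function)."""
        return sum(v * v for v in x)

    def evolve(dim=5, pop_size=20, generations=100, sigma=0.3, seed=0):
        rng = random.Random(seed)
        population = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
        for _ in range(generations):
            # Mutation: each parent produces one Gaussian-perturbed offspring.
            offspring = [[v + rng.gauss(0, sigma) for v in parent] for parent in population]
            # Survivor selection: keep the best pop_size of parents + offspring.
            population = sorted(population + offspring, key=objective)[:pop_size]
        return population[0]

    best = evolve()
    print(objective(best))  # should be close to 0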

    An introduction to programming Physics-Informed Neural Network-based computational solid mechanics

    Full text link
    Physics-informed neural networks (PINNs) have recently gained increasing interest in computational mechanics. In this work, we present a detailed introduction to programming PINN-based computational solid mechanics. Two prevailingly used physics-informed loss functions for PINN-based computational solid mechanics are summarised, and numerical examples ranging from 1D to 3D solid problems are presented to show the performance of PINN-based computational solid mechanics. The programs are built with the Python programming language and the TensorFlow library, with step-by-step explanations. It is worth highlighting that PINN-based computational mechanics is easy to implement and can be extended to more challenging applications. This work aims to give researchers who are interested in PINN-based solid mechanics solvers a clear insight into this emerging area. The programs for all the numerical examples presented in this work are available at https://github.com/JinshuaiBai/PINN_Comp_Mech. (32 pages, 20 figures)
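
    A minimal sketch of a physics-informed collocation loss for a 1D bar governed by EA u''(x) + f(x) = 0 with fixed ends, in the spirit of the tutorial described above. This is not the code from the linked repository; the network size, body force, and equal loss weighting are illustrative assumptions. TensorFlow 2.x is assumed.

    import tensorflow as tf

    # Small fully connected network approximating the displacement field u(x).
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(20, activation="tanh"),
        tf.keras.layers.Dense(20, activation="tanh"),
        tf.keras.layers.Dense(1),
    ])

    EA = 1.0                                   # assumed axial stiffness
    f = lambda x: tf.sin(3.14159265 * x)       # assumed body force

    def pinn_loss(x_col, x_bc, u_bc):
        """PDE residual at collocation points plus boundary-condition mismatch."""
        with tf.GradientTape() as t2:
            t2.watch(x_col)
            with tf.GradientTape() as t1:
                t1.watch(x_col)
                u = model(x_col)
            u_x = t1.gradient(u, x_col)        # first derivative du/dx
        u_xx = t2.gradient(u_x, x_col)         # second derivative d2u/dx2
        residual = EA * u_xx + f(x_col)        # strong-form equilibrium residual
        bc = model(x_bc) - u_bc                # displacement boundary conditions
        return tf.reduce_mean(residual ** 2) + tf.reduce_mean(bc ** 2)

    # Toy usage: random collocation points on (0, 1) and fixed ends u(0) = u(1) = 0.
    x_col = tf.random.uniform((100, 1))
    x_bc = tf.constant([[0.0], [1.0]])
    u_bc = tf.zeros((2, 1))
    loss = pinn_loss(x_col, x_bc, u_bc)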

    Data Privacy and Security in Cloud Computing Environments

    Get PDF
    Cloud computing has been adopted worldwide as an environment for organizing data and managing storage, processing, and access. This technical development has, however, raised questions about data security and privacy in cloud computing environments. The purpose of this abstract is to offer a thorough review of the issues, solutions, and future developments related to data privacy and security in cloud computing. Keeping data private and secure while it is processed and stored in third-party data centres is the main difficulty in cloud computing systems. The abstract discusses the dangers of insider threats, data breaches, and unauthorized access to sensitive information, and digs further into the legal and compliance requirements that businesses must follow in order to protect user data in the cloud. Data privacy and security in cloud computing environments remain critical concerns for organizations and individuals alike. The survey explains how cloud storage is used globally and sets out its challenges, solutions, and future innovations, underscoring the importance of robust encryption, access controls, user awareness, and emerging technologies in safeguarding data in the cloud. By addressing these concerns, organizations can leverage the power of cloud computing while maintaining the confidentiality, integrity, and availability of their data.
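
    A minimal sketch of one safeguard named above, client-side encryption before data is sent to cloud storage, using the Fernet recipe from the cryptography package. The upload call is a placeholder assumption rather than a real provider API.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # keep this key outside the cloud provider
    cipher = Fernet(key)

    record = b"patient_id=123;diagnosis=..."
    ciphertext = cipher.encrypt(record)  # what actually gets stored in the cloud

    # upload_to_cloud(ciphertext)        # hypothetical placeholder for the provider's SDK call

    assert cipher.decrypt(ciphertext) == record  # only the key holder can read the data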